Supervised Information Feature Compression Algorithm Based on Divergence Criterion
Authors
Abstract
In this paper, a novel supervised information feature compression algorithm based on a divergence criterion is developed. First, drawing on information theory, the concept and properties of discrete divergence, i.e. average separability information (ASI), are studied, and the concept of symmetric average separability information (SASI) is proposed. We prove that SASI is a distance measure, i.e. it satisfies the three axioms of a distance, so it can be used to measure the degree of difference between the two classes of a two-class problem. Second, based on SASI, a compression theorem is given that can be used to design information feature compression algorithms. Building on these results, we construct a novel supervised information feature compression algorithm based on the average SASI criterion for multi-class problems. Finally, experimental results demonstrate that the proposed algorithm is valid and reliable.
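The abstract does not give SASI in closed form, but it describes symmetrizing a discrete divergence between two class-conditional distributions. A classical measure of this kind is the J-divergence, obtained by symmetrizing the Kullback-Leibler divergence; the sketch below is illustrative of that general construction, not the paper's exact SASI definition:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p||q) between two discrete distributions.

    Terms with p_i = 0 contribute 0 by the usual convention 0*log(0/q) = 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def j_divergence(p, q):
    """Symmetrized divergence J(p, q) = KL(p||q) + KL(q||p).

    J is non-negative, symmetric, and zero iff p == q, which is why
    symmetrized divergences are natural class-separability measures.
    """
    return kl_divergence(p, q) + kl_divergence(q, p)

# Two hypothetical class-conditional distributions over 3 feature values.
p = [0.7, 0.2, 0.1]
q = [0.1, 0.3, 0.6]
print(j_divergence(p, q))   # large value: the classes are well separated
print(j_divergence(p, p))   # 0.0: identical distributions are inseparable
```

In a feature-compression setting, a score of this form can be computed per feature (or per candidate subspace) and the features with the largest separability retained.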
Similar resources
A Flexible Semi-supervised Feature Extraction Method for Image Classification
This paper proposes a novel discriminant semi-supervised feature extraction method for generic classification and recognition tasks. The paper has two main contributions. First, we propose a flexible linear semi-supervised feature extraction method that seeks a non-linear subspace that is close to a linear one. The proposed method is based on a criterion that simultaneously exploits the discrimination ...
Semi-supervised dimensionality reduction using orthogonal projection divergence-based clustering for hyperspectral imagery
Band clustering and selection are applied to dimensionality reduction of hyperspectral imagery. The proposed method is based on a hierarchical clustering structure, which aims to group bands using an information or similarity measure. Specifically, the distance based on orthogonal projection divergence is used as a criterion for clustering. After clustering, a band selection step is applied to ...
Feature Extraction by Non-Parametric Mutual Information Maximization
We present a method for learning discriminative feature transforms using as criterion the mutual information between class labels and transformed features. Instead of a commonly used mutual information measure based on Kullback-Leibler divergence, we use a quadratic divergence measure, which allows us to make an efficient non-parametric implementation and requires no prior assumptions about cla...
Bayes Optimal Feature Selection for Supervised Learning with General Performance Measures
The problem of feature selection is critical in several areas of machine learning and data analysis. Here we consider feature selection for supervised learning problems, where one wishes to select a small set of features that facilitate learning a good prediction model in the reduced feature space. Our interest is primarily in filter methods that select features independently of the learning al...
Hypergraph Spectra for Semi-supervised Feature Selection
In many data analysis tasks, one is often confronted with the problem of selecting features from very high dimensional data. Most existing feature selection methods focus on ranking individual features based on a utility criterion, and select the optimal feature set in a greedy manner. However, the feature combinations found in this way do not give optimal classification performance, since they...
Journal:
Volume, Issue:
Pages: -
Publication year: 2007